Perfect Recovery Conditions for Non-negative Sparse Modeling
Authors
Abstract
Similar papers
Non-negative Sparse Modeling of Textures
This paper presents a statistical model for textures that uses a non-negative decomposition on a set of local atoms learned from an exemplar. The model is described by the variances and kurtoses of the marginals of the decomposition of patches in the learned dictionary. A fast sampling algorithm allows a typical image to be drawn from this model. The resulting texture synthesis captures the geomet...
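As a rough, hedged illustration of the decomposition step sketched in this abstract (not the authors' implementation), the Python snippet below computes a non-negative decomposition of exemplar patches on a fixed dictionary via NNLS and then the variance and kurtosis of the marginals. The array names `patches` and `D`, the random stand-in data, and the use of scipy's nnls solver are assumptions of this sketch.

import numpy as np
from scipy.optimize import nnls
from scipy.stats import kurtosis

def nonneg_patch_codes(patches, D):
    # Solve min_a ||p - D a||_2 subject to a >= 0 for each patch p (one row of `patches`).
    codes = np.empty((patches.shape[0], D.shape[1]))
    for i, p in enumerate(patches):
        codes[i], _ = nnls(D, p)
    return codes

def marginal_statistics(codes):
    # Per-atom variance and kurtosis of the non-negative coefficients.
    return codes.var(axis=0), kurtosis(codes, axis=0)

# Toy stand-ins for a learned dictionary (8x8 patches, 32 atoms) and exemplar patches.
rng = np.random.default_rng(0)
D = np.abs(rng.standard_normal((64, 32)))
patches = np.abs(rng.standard_normal((500, 64)))
variances, kurtoses = marginal_statistics(nonneg_patch_codes(patches, D))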
Sparse Non-negative Matrix Language Modeling
We present Sparse Non-negative Matrix (SNM) estimation, a novel probability estimation technique for language modeling that can efficiently incorporate arbitrary features. We evaluate SNM language models on two corpora: the One Billion Word Benchmark and a subset of the LDC English Gigaword corpus. Results show that SNM language models trained with n-gram features are a close match for the well...
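The abstract above names the technique only at a high level; the sketch below illustrates the general form of such a model rather than the paper's SNM estimator: a word is scored by summing sparse non-negative matrix entries over the features active in its history. The single bigram feature, the plain relative-count weights, and all function names are illustrative assumptions.

from collections import defaultdict

def bigram_feature(history):
    # Single illustrative feature: the previous word (empty history maps to "<s>").
    return history[-1] if history else "<s>"

def train_counts(corpus):
    # Sparse non-negative matrix M stored as nested dicts: feature -> word -> count.
    M = defaultdict(lambda: defaultdict(float))
    for sentence in corpus:
        history = []
        for word in sentence:
            M[bigram_feature(history)][word] += 1.0
            history.append(word)
    return M

def prob(M, history, word, vocab):
    # Normalize the summed non-negative scores over the vocabulary.
    f = bigram_feature(history)
    scores = {w: M[f].get(w, 0.0) for w in vocab}
    total = sum(scores.values()) or 1.0
    return scores.get(word, 0.0) / total

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
vocab = {w for s in corpus for w in s}
M = train_counts(corpus)
p = prob(M, ["the"], "cat", vocab)   # 0.5 under these toy counts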
Sparse non-negative matrix language modeling for skip-grams
We present a novel family of language model (LM) estimation techniques named Sparse Non-negative Matrix (SNM) estimation. A first set of experiments empirically evaluating these techniques on the One Billion Word Benchmark [3] shows that with skip-gram features SNM LMs are able to match the state-of-the-art recurrent neural network (RNN) LMs; combining the two modeling techniques yields the best ...
Supplement to 'Sparse recovery by thresholded non-negative least squares'
We here provide additional proofs, definitions, lemmas and derivations omitted from the paper. Note that material contained in the latter is referred to by the captions used there (e.g. Theorem 1), whereas auxiliary statements contained exclusively in this supplement are preceded by a capital Roman letter (e.g. Theorem A.1). A. Sub-Gaussian random variables and concentration inequalities. A random...
Sparse recovery by thresholded non-negative least squares
Non-negative data are commonly encountered in numerous fields, making non-negative least squares regression (NNLS) a frequently used tool. At least relative to its simplicity, it often performs rather well in practice. Serious doubts about its usefulness arise for modern high-dimensional linear models. Even in this setting, contrary to what first intuition may suggest, we show that for a broad class of ...
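A minimal sketch of the two-stage procedure the title refers to, under assumptions made here rather than the paper's exact tuning: fit NNLS, then hard-threshold small coefficients to estimate the support. The threshold value and the synthetic data below are arbitrary.

import numpy as np
from scipy.optimize import nnls

def thresholded_nnls(X, y, tau):
    # Stage 1: non-negative least squares fit; Stage 2: hard thresholding at `tau`.
    beta, _ = nnls(X, y)
    beta_thr = np.where(beta > tau, beta, 0.0)
    return beta_thr, np.flatnonzero(beta_thr)

# Toy high-dimensional example: 100 observations, 200 predictors, a 5-sparse non-negative signal.
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 200))
beta_true = np.zeros(200)
beta_true[:5] = 2.0
y = X @ beta_true + 0.1 * rng.standard_normal(100)
beta_hat, support_hat = thresholded_nnls(X, y, tau=0.5)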
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2017
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/tsp.2016.2613067